Forward-LASSO with Adaptive Shrinkage

Authors

  • Peter Radchenko
  • Gareth M. James
Abstract

Both classical Forward Selection and the more modern Lasso provide computationally feasible methods for performing variable selection in high-dimensional regression problems involving many predictors. We note that although the Lasso is the solution to an optimization problem while Forward Selection is purely algorithmic, the two methods turn out to operate in surprisingly similar fashions. Our results demonstrate, both empirically and theoretically, that neither procedure dominates the other. We propose a new method we call Forward-Lasso Adaptive SHrinkage (FLASH), which incorporates both Forward Selection and the Lasso as special cases. FLASH works well in situations where either Forward Selection or the Lasso dominates, but also performs well in situations where neither method succeeds. FLASH is fitted using a variant of the computationally efficient LARS algorithm. We provide an extensive theoretical analysis showing that many of the error bounds that have recently been developed for the Lasso can be improved using FLASH. Finally, we demonstrate, through numerous simulations and a real-world data set, that FLASH generally outperforms many competing approaches.

Some key words: Forward Selection; Lasso; Shrinkage; Variable Selection
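
As a point of reference for the comparison drawn in the abstract, the sketch below fits the two baseline procedures that FLASH bridges, the Lasso (coefficients shrunk along the LARS path) and Forward Selection (greedy inclusion followed by an unshrunk least-squares refit), on synthetic sparse data. This is an illustrative sketch only, using scikit-learn estimators rather than the authors' FLASH/LARS variant; the sizes n, p, k and the noise level are arbitrary assumptions.

```python
# Illustrative sketch only: the two baselines that FLASH bridges, not FLASH itself.
import numpy as np
from sklearn.linear_model import LassoLarsCV, LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
n, p, k = 100, 50, 5                       # assumed sizes: n observations, p predictors, k true signals
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 2.0                             # sparse true coefficient vector
y = X @ beta + rng.standard_normal(n)

# Lasso along the LARS path, penalty chosen by cross-validation (shrunk coefficients).
lasso = LassoLarsCV(cv=5).fit(X, y)
lasso_support = np.flatnonzero(lasso.coef_)

# Forward Selection: greedily add predictors, then refit by least squares (no shrinkage).
sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=k,
                                direction="forward", cv=5).fit(X, y)
fwd_support = np.flatnonzero(sfs.get_support())

print("Lasso support:  ", lasso_support)
print("Forward support:", fwd_support)
```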

Similar articles

Improved Variable Selection with Forward-Lasso Adaptive Shrinkage

Recently, considerable interest has focused on variable selection methods in regression situations where the number of predictors, p, is large relative to the number of observations, n. Two commonly applied variable selection approaches are the Lasso, which computes highly shrunk regression coefficients, and Forward Selection, which uses no shrinkage. We propose a new approach, “Forward-Lasso A...
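
To make the contrast in this abstract concrete, here is a minimal sketch, on assumed synthetic data, of the "highly shrunk" Lasso coefficients versus an unshrunk least-squares refit on the same selected variables; it only illustrates the shrinkage gap between the two approaches, not the Forward-Lasso Adaptive Shrinkage method itself.

```python
# Assumed synthetic setup (not from the paper): Lasso shrinks, an OLS refit does not.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(1)
n, p = 80, 20
X = rng.standard_normal((n, p))
beta = np.r_[3.0, -2.0, 1.5, np.zeros(p - 3)]    # three true signals
y = X @ beta + rng.standard_normal(n)

lasso = Lasso(alpha=0.3).fit(X, y)
support = np.flatnonzero(lasso.coef_)

ols = LinearRegression().fit(X[:, support], y)   # unshrunk refit on the Lasso support

print("true nonzero coefficients:", beta[support])
print("Lasso estimates (shrunk): ", lasso.coef_[support].round(2))
print("OLS refit (no shrinkage): ", ols.coef_.round(2))
```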

FIRST: Combining forward iterative selection and shrinkage in high dimensional sparse linear regression

We propose a new class of variable selection techniques for regression in high dimensional linear models, based on a forward selection version of the LASSO, adaptive LASSO or elastic net, respectively called the forward iterative regression and shrinkage technique (FIRST), adaptive FIRST and elastic FIRST. These methods seem to work effectively for extremely sparse high dimensional linear m...
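
The exact FIRST updates are not reproduced here; what follows is a generic forward-stagewise sketch in the same spirit (greedily select the predictor most correlated with the residual and apply a small, shrinkage-like update), with the step size and iteration count chosen arbitrarily.

```python
# Generic forward-stagewise sketch, not the FIRST algorithm from the paper.
import numpy as np

def forward_stagewise(X, y, step=0.01, n_iter=2000):
    """Repeatedly nudge the coefficient of the predictor most correlated
    with the current residual; the small steps keep the fit heavily shrunk."""
    _, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()
    for _ in range(n_iter):
        corr = X.T @ resid
        j = np.argmax(np.abs(corr))          # most correlated predictor
        delta = step * np.sign(corr[j])
        beta[j] += delta
        resid -= delta * X[:, j]
    return beta

rng = np.random.default_rng(2)
n, p = 100, 30
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)               # unit-norm columns
beta_true = np.r_[np.ones(4), np.zeros(p - 4)]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

print(forward_stagewise(X, y).round(2)[:8])  # first 8 estimated coefficients
```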

Shrinkage estimation and variable selection in multiple regression models with random coefficient autoregressive errors

In this paper, we consider improved estimation strategies for the parameter vector in multiple regression models with first-order random coefficient autoregressive errors (RCAR(1)). We propose a shrinkage estimation strategy and implement variable selection methods such as lasso and adaptive lasso strategies. The simulation results reveal that the shrinkage estimators perform better than both l...
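
As a reminder of the adaptive lasso step mentioned here, the sketch below uses the standard column-reweighting trick (pilot estimate, weighted Lasso, rescale back) under assumed i.i.d. errors; the RCAR(1) error structure studied in the paper is not modeled.

```python
# Adaptive Lasso via column reweighting; i.i.d. errors assumed, no RCAR(1) structure.
import numpy as np
from sklearn.linear_model import LassoCV, Ridge

rng = np.random.default_rng(3)
n, p = 120, 15
X = rng.standard_normal((n, p))
beta = np.r_[2.0, -1.5, 0.0, 1.0, np.zeros(p - 4)]
y = X @ beta + rng.standard_normal(n)

# Step 1: pilot estimate (ridge keeps it stable when predictors are correlated).
pilot = Ridge(alpha=1.0).fit(X, y).coef_
w = 1.0 / (np.abs(pilot) + 1e-6)           # adaptive weights: large for weak predictors

# Step 2: ordinary Lasso on the reweighted design, then undo the weights.
X_w = X / w                                # column j scaled by 1 / w_j
fit = LassoCV(cv=5).fit(X_w, y)
beta_adaptive = fit.coef_ / w

print(beta_adaptive.round(2))
```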

Variable Inclusion and Shrinkage Algorithms

The Lasso is a popular and computationally efficient procedure for automatically performing both variable selection and coefficient shrinkage on linear regression models. One limitation of the Lasso is that the same tuning parameter is used for both variable selection and shrinkage. As a result, it typically ends up selecting a model with too many variables to prevent over shrinkage of the regr...
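
The limitation described here, a single penalty controlling both selection and shrinkage, can be seen in a small synthetic experiment (assumed data, not from the paper): a small Lasso penalty keeps the signal coefficients near their true values but admits noise variables, while a penalty large enough to remove them also shrinks the signal.

```python
# One tuning parameter for both jobs: compare a small and a large Lasso penalty.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, p = 100, 40
X = rng.standard_normal((n, p))
beta = np.r_[3.0, 3.0, 3.0, np.zeros(p - 3)]   # three true signals, the rest noise
y = X @ beta + rng.standard_normal(n)

for alpha in (0.05, 0.5):
    fit = Lasso(alpha=alpha).fit(X, y)
    n_selected = np.count_nonzero(fit.coef_)
    signal = fit.coef_[:3].round(2)
    print(f"alpha={alpha}: {n_selected} variables selected, "
          f"signal coefficients {signal}")
```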

Absolute Penalty and Shrinkage Estimation Strategies in Linear and Partially Linear Models

In this dissertation we studied asymptotic properties of shrinkage estimators, and compared their performance with absolute penalty estimators (APE) in linear and partially linear models (PLM). A robust shrinkage M-estimator is proposed for PLM, and asymptotic properties are investigated, both analytically and through simulation studies. In Chapter 2, we compared the performance of shrinkage an...
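
For orientation, the snippet below sketches a generic positive-part Stein-type shrinkage estimator of the kind this literature compares against penalty estimators, shrinking the full-model OLS fit toward a submodel; it is an assumed textbook form with homoskedastic errors, not the dissertation's robust M-estimator for partially linear models.

```python
# Generic positive-part Stein-type shrinkage toward a submodel (illustrative form only).
import numpy as np

def stein_shrinkage(X, y, restricted):
    """Shrink the full OLS estimate toward a submodel that sets the
    coefficients indexed by `restricted` to zero."""
    n, p = X.shape
    keep = np.setdiff1d(np.arange(p), restricted)

    beta_full = np.linalg.lstsq(X, y, rcond=None)[0]           # unrestricted OLS
    beta_sub = np.zeros(p)
    beta_sub[keep] = np.linalg.lstsq(X[:, keep], y, rcond=None)[0]

    # Wald-type distance between the two fits on the restricted coordinates.
    resid = y - X @ beta_full
    sigma2 = resid @ resid / (n - p)
    XtX_inv = np.linalg.inv(X.T @ X)
    d = beta_full[restricted]
    D = d @ np.linalg.solve(XtX_inv[np.ix_(restricted, restricted)], d) / sigma2

    k = len(restricted)
    shrink = max(0.0, 1.0 - (k - 2) / D)       # positive-part Stein factor
    return beta_sub + shrink * (beta_full - beta_sub)

rng = np.random.default_rng(5)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta = np.r_[2.0, 1.0, np.zeros(p - 2)]
y = X @ beta + rng.standard_normal(n)

print(stein_shrinkage(X, y, restricted=np.arange(2, p)).round(2))
```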


Journal:

Volume   Issue

Pages  -

Publication date: 2009